Allen
How Data Literacy Can Keep America Safe
On May 9, 2023, two days after a white supremacist murdered eight people at an outlet mall in Allen, Texas, Elon Musk condemned "the media" for disproportionately focusing on violence committed against people of color. This particular criticism took the form of a profoundly misleading graphic, which claimed to show that a huge majority of "interracial violent crimes" in the United States are committed by Black people against white people, rather than the other way around. Among other problems, the chart depicted the raw number of crime victims by race, without adjusting for the fact that there are around five and a half times as many white Americans as Black Americans. In other words, there are more white victims of "interracial crime" in America because there are more white people, period. Nevertheless, the tweet went viral: it was viewed more than 14 million times and retweeted by tens of thousands of users.
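To make the base-rate problem concrete, here is a minimal sketch of the adjustment the chart skipped: dividing each group's raw victim count by its population before comparing. Every specific number below is a hypothetical placeholder chosen for illustration; only the roughly 5.5-to-1 white-to-Black population ratio comes from the passage above.

# Minimal sketch of the base-rate adjustment the chart skipped.
# All specific numbers are hypothetical placeholders; only the roughly
# 5.5-to-1 white-to-Black population ratio comes from the passage.

population = {"white": 5.5, "Black": 1.0}  # relative group sizes

# Suppose (hypothetically) both groups experienced the SAME
# per-capita victimization rate.
rate_per_unit_population = 100

raw_counts = {group: rate_per_unit_population * size
              for group, size in population.items()}

for group, count in raw_counts.items():
    per_capita = count / population[group]
    print(f"{group}: raw count = {count:.0f}, per-capita rate = {per_capita:.0f}")

# Identical per-capita rates still produce a raw count 5.5 times larger
# for the bigger group; comparing raw counts without dividing by
# population size manufactures a difference that is not there.

Run as-is, the script prints equal per-capita rates alongside raw counts that differ by a factor of five and a half, which is exactly the distortion the passage describes.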
A B-P ANN Commodity Trader
Joseph E. Collard
Martingale Research Corporation
100 Allentown Pkwy., Suite 211
Allen, Texas 75002

Abstract

An Artificial Neural Network (ANN) is trained to recognize a buy/sell (long/short) pattern for a particular commodity futures contract. The back-propagation of errors algorithm was used to encode the relationship between the desired long/short output and 18 fundamental variables plus 6 (or 18) technical variables into the ANN. Trained on one year of past data, the ANN is able to predict long/short market positions for 9 months into the future that would have made a $10,301 profit on an investment of less than $1,000.

1 INTRODUCTION

An Artificial Neural Network (ANN) is trained to recognize a long/short pattern for a particular commodity futures contract. The back-propagation of errors algorithm was used to encode the relationship between the desired long/short output and 18 fundamental variables plus 6 (or 18) technical variables into the ANN.

2 NETWORK ARCHITECTURE

The ANNs used were simple feed-forward networks with a single hidden layer: one input unit for each of the 24 (or 36) input variables, N hidden units, and one output unit. N varied from six (6) through sixteen (16) hidden units.
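The excerpt does not include code, but the architecture it describes maps onto a short back-propagation routine. The Python/NumPy sketch below is an illustration under stated assumptions, not the author's implementation: it uses 24 input units (18 fundamental plus 6 technical variables), a hidden-layer size taken from the 6-to-16 range, sigmoid activations, a squared-error loss, a fixed learning rate, and randomly generated placeholder data standing in for the actual one-year market series, none of which are specified in the text above.

# Sketch of a single-hidden-layer, feed-forward network trained by
# back-propagation of errors, in the spirit of the architecture described
# above. Activations, learning rate, epochs, and the random data are
# assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

N_INPUT, N_HIDDEN, N_OUTPUT = 24, 10, 1   # N_HIDDEN chosen from the 6..16 range
LEARNING_RATE, EPOCHS = 0.1, 2000

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Small random initial weights and zero biases.
W1 = rng.normal(scale=0.1, size=(N_INPUT, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_OUTPUT))
b2 = np.zeros(N_OUTPUT)

# Placeholder training set: ~250 trading days of scaled input vectors and the
# desired long(1)/short(0) position for each day. Real inputs would be the 18
# fundamental and 6 (or 18) technical series the paper mentions.
X = rng.normal(size=(250, N_INPUT))
y = (rng.random((250, N_OUTPUT)) > 0.5).astype(float)

for epoch in range(EPOCHS):
    # Forward pass through the single hidden layer.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Back-propagation of errors (squared-error loss, sigmoid derivatives).
    err_out = (output - y) * output * (1.0 - output)
    err_hid = (err_out @ W2.T) * hidden * (1.0 - hidden)

    # Gradient-descent weight updates, averaged over the training set.
    W2 -= LEARNING_RATE * hidden.T @ err_out / len(X)
    b2 -= LEARNING_RATE * err_out.mean(axis=0)
    W1 -= LEARNING_RATE * X.T @ err_hid / len(X)
    b1 -= LEARNING_RATE * err_hid.mean(axis=0)

# Read the trained output unit as a market position: long above 0.5, else short.
positions = np.where(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5, "long", "short")
print(positions[:10])

In use, the placeholder arrays would be replaced by the historical input series and the desired long/short signal for the training year, and the trained network would then be evaluated on the subsequent months, which is the out-of-sample test the abstract reports.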